Markov Process
Definition of Markov Process
1. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.
Noun
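The defining property above (the "memorylessness" of the next-state distribution) can be made concrete with a short simulation. The following is a minimal sketch of a two-state Markov chain in Python; the state names ("sunny", "rainy") and the transition probabilities are hypothetical, chosen only to illustrate that the next state is sampled from a distribution determined by the current state alone.

```python
import random

# Hypothetical two-state transition table: each row gives the
# distribution of the next state given the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state given only the current state.

    The function takes no history argument: how the chain arrived
    at `current` is irrelevant, which is exactly the Markov
    property in the definition above.
    """
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Simulate a short trajectory starting from "sunny".
state = "sunny"
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
print(" -> ".join(path))
```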
Synonyms for the word "Markov process"
Markoff process
Words semantically linked with "Markov process"
Markoff chain
Markov chain
stochastic process
Hypernyms for the word "Markov process"
stochastic process